    Genuine Counterfactual Communication with a Nanophotonic Processor

    In standard communication, information is carried by particles or waves. Counterintuitively, in counterfactual communication, particles and information can travel in opposite directions. The quantum Zeno effect allows Bob to transmit a message to Alice by encoding information in particles he never interacts with. The first suggested protocol not only required thousands of ideal optical components, but also resulted in a so-called "weak trace" of the particles having travelled from Bob to Alice, calling the scalability and counterfactuality of previous proposals and experiments into question. Here we overcome these challenges, implementing a new protocol in a programmable nanophotonic processor, based on reconfigurable silicon-on-insulator waveguides that operate at telecom wavelengths. This, together with our telecom single-photon source and highly efficient superconducting nanowire single-photon detectors, provides a versatile and stable platform for a high-fidelity implementation of genuinely trace-free counterfactual communication, allowing us to actively tune the number of steps in the Zeno measurement and achieve a bit error probability below 1%, with neither post-selection nor a weak trace. Our demonstration shows how our programmable nanophotonic processor could be applied to more complex counterfactual tasks and quantum information protocols. (Comment: 6 pages, 4 figures)
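
    As a rough, hedged illustration (a textbook chained-Zeno sketch, not the protocol implemented in the paper), the bit error probability in such schemes falls as the number of interrogation steps N grows: each step applies a small rotation theta = pi/(2N), and when Bob blocks his arm the photon is repeatedly projected back into Alice's apparatus with probability cos^2(theta) per step. A minimal Python sketch of that scaling:

        import math

        def zeno_survival_probability(n_steps: int) -> float:
            """Probability the photon stays on Alice's side after n_steps
            interrogations when Bob blocks his arm (chained quantum Zeno effect)."""
            theta = math.pi / (2 * n_steps)          # per-step rotation angle
            return math.cos(theta) ** (2 * n_steps)  # survives each step with prob cos^2(theta)

        for n in (5, 10, 50, 250):
            p = zeno_survival_probability(n)
            print(f"N = {n:3d}: survival = {p:.4f}, error = {1 - p:.4f}")

    In this toy model the error stays above 1% until N reaches a few hundred, which is one reason the ability to actively tune the number of Zeno steps matters.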

    Deglaciation of Fennoscandia

    To provide a new reconstruction of the deglaciation of the Fennoscandian Ice Sheet, in the form of calendar-year time-slices that are particularly useful for ice sheet modelling, we have compiled and synthesized published geomorphological data for eskers, ice-marginal formations, lineations, marginal meltwater channels, striae, ice-dammed lakes, and geochronological data from radiocarbon, varve, optically-stimulated luminescence, and cosmogenic nuclide dating. This is summarized as a deglaciation map of the Fennoscandian Ice Sheet with isochrons marking every 1000 years between 22 and 13 cal kyr BP and every hundred years between 11.6 cal kyr BP and final ice decay after 9.7 cal kyr BP. Deglaciation patterns vary across the Fennoscandian Ice Sheet domain, reflecting differences in climatic and geomorphic settings as well as ice sheet basal thermal conditions and terrestrial versus marine margins. For example, the ice sheet margin in the high-precipitation coastal setting of the western sector responded sensitively to climatic variations, leaving a detailed record of prominent moraines and ice-marginal deposits in many fjords and coastal valleys. Retreat rates across the southern sector differed between slow retreat of the terrestrial margin in western and southern Sweden and rapid retreat of the calving ice margin in the Baltic Basin. Our reconstruction is consistent with much of the published research. However, the synthesis of a large amount of existing and new data supports refined reconstructions in some areas. For example, we locate the LGM extent of the ice sheet in northwestern Russia further east than previously suggested and conclude that it was reached later than for the rest of the ice sheet, at around 17-15 cal kyr BP, and we propose a slightly different chronology of moraine formation over southern Sweden, based on improved correlations of moraine segments using new LiDAR data and on tying the timing of moraine formation to Greenland ice core cold stages. Retreat rates vary by as much as an order of magnitude in different sectors of the ice sheet, with the lowest rates on the high-elevation and maritime Norwegian margin. Comparison of retreat rates with the climatic information provided by the Greenland ice core record shows a general correspondence between retreat rate and climatic forcing, although a close match between retreat rate and climate is unlikely because of other controls, such as topography and marine versus terrestrial margins. Overall, the time-slice reconstructions of Fennoscandian Ice Sheet deglaciation from 22 to 9.7 cal kyr BP provide an important dataset for understanding the contexts that underpin spatial and temporal patterns in retreat of the Fennoscandian Ice Sheet, and are an important resource for testing and refining ice sheet models.
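
    As a purely illustrative aside (the figures below are hypothetical, not values from the compilation), the retreat rates discussed here reduce to the distance between two dated isochrons divided by the time separating them:

        def retreat_rate_m_per_yr(distance_km: float,
                                  age_older_cal_kyr: float,
                                  age_younger_cal_kyr: float) -> float:
            """Mean ice-margin retreat rate between two isochrons, from their
            separation and calibrated ages (cal kyr BP)."""
            dt_years = (age_older_cal_kyr - age_younger_cal_kyr) * 1000.0
            return distance_km * 1000.0 / dt_years

        # hypothetical example: isochrons 120 km apart, dated 11.6 and 10.8 cal kyr BP
        print(f"{retreat_rate_m_per_yr(120.0, 11.6, 10.8):.0f} m/yr")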

    Construction and analysis of causally dynamic hybrid bond graphs

    Engineering systems are frequently abstracted to models with discontinuous behaviour (such as a switch or contact), and a hybrid model is one which contains both continuous and discontinuous behaviours. Bond graphs are an established physical modelling method, but there are several methods for constructing switched or ‘hybrid’ bond graphs, developed for either qualitative ‘structural’ analysis or efficient numerical simulation of engineering systems. This article proposes a general hybrid bond graph suitable for both. The controlled junction is adopted as an intuitive way of modelling a discontinuity in the model structure. This element gives rise to ‘dynamic causality’, which is facilitated by a new bond graph notation. From this model, the junction structure and state equations are derived and compared to those obtained by existing methods. The proposed model includes all possible modes of operation and can be represented by a single set of equations. The controlled junctions manifest as Boolean variables in the matrices of coefficients. The method is more compact and intuitive than existing methods and dispenses with the need to derive the various modes of operation from a given reference representation. Hence, a method has been developed that can reach common usage and form a platform for further study.
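
    A minimal, hypothetical sketch of the final point (not taken from the article): with a controlled junction encoded as a Boolean variable, both operating modes of a switched system live in a single coefficient matrix, so one set of state equations covers every mode. The matrices below are invented purely for illustration:

        import numpy as np

        def coefficient_matrix(junction_on: bool) -> np.ndarray:
            """State matrix A(lam) with the controlled junction as Boolean lam;
            lam = 1 connects the second state, lam = 0 isolates it."""
            lam = 1.0 if junction_on else 0.0
            return np.array([[-1.0,          lam],
                             [-lam, -0.1 - 0.5 * lam]])

        x = np.array([1.0, 0.5])
        dt = 0.01
        for step in range(1000):                       # explicit Euler; junction toggles halfway
            x = x + dt * coefficient_matrix(step < 500) @ x
        print(x)

    The point of the Boolean parameterisation is that no separate reference model per mode is needed; the same expression is evaluated whichever way the junction is switched.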

    The mixed problem in L^p for some two-dimensional Lipschitz domains

    We consider the mixed problem for the Laplace operator in a class of Lipschitz graph domains in two dimensions with Lipschitz constant at most 1. The boundary of the domain is decomposed into two disjoint sets D and N. We suppose that the Dirichlet data f_D has one derivative in L^p(D) on the boundary and that the Neumann data lies in L^p(N). We find conditions on the domain and on the sets D and N so that there is a p_0 > 1 such that, for p in the interval (1, p_0), we may find a unique solution to the mixed problem whose gradient lies in L^p.
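
    For concreteness, the mixed (Zaremba) problem described here can be written in the following standard form, where Omega is the Lipschitz graph domain, its boundary is the disjoint union of D and N, and W^{1,p}(D) abbreviates "one derivative in L^p(D)"; this is a generic statement of the problem, not text quoted from the paper:

        \begin{aligned}
          \Delta u &= 0 && \text{in } \Omega,\\
          u &= f_D \in W^{1,p}(D) && \text{on } D,\\
          \frac{\partial u}{\partial \nu} &= f_N \in L^{p}(N) && \text{on } N,\\
          \nabla u &\in L^{p}(\partial\Omega) && \text{for } 1 < p < p_0 .
        \end{aligned}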

    A new approach for developing continuous age-depth models from dispersed chronologic data: applications to the Miocene Santa Cruz formation, Argentina

    Traditional methods of age-depth modeling (linear regression, spline fitting) generate overly optimistic confidence intervals. Originally developed for ¹⁴C dating, Bayesian models (which incorporate observations independent of the chronology itself) allow prior information about the superposition of dated horizons, the stratigraphic position of undated points, and variations in sedimentology and sedimentation rate to be built into the model fitting. We modified the methodology of two Bayesian age-depth models, Bchron (Haslett and Parnell, 2008) and OxCal (Ramsey, 2008), for use with U-Pb dates. Some practical implications of this approach include: a) model age uncertainties increase in intervals that lack closely spaced age constraints; b) models do not assume normal distributions, allowing for the non-symmetric uncertainties of sometimes complex crystal age probability functions in volcanic tuffs; c) superpositional constraints can objectively reject some cases of zircon inheritance and mitigate apparent age complexities. We use this approach to produce an age-depth model, with continuous and realistic uncertainties, for the early Miocene Santa Cruz Formation (SCF), Argentina. (Facultad de Ciencias Naturales y Museo)
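
    A minimal, hypothetical sketch of the superposition idea (it is not the Bchron or OxCal machinery, and the dates are invented): sample an age for each dated horizon from its own uncertainty distribution, discard draws that violate stratigraphic order, and read confidence envelopes off the accepted draws.

        import numpy as np

        rng = np.random.default_rng(0)

        # hypothetical dated tuffs: (depth in m, mean age in Ma, 1-sigma in Ma)
        horizons = [(5.0, 17.0, 0.10), (40.0, 17.4, 0.15), (90.0, 17.8, 0.08)]

        def sample_monotonic_ages(n_draws: int = 5000) -> np.ndarray:
            """Monte Carlo draws of horizon ages, keeping only sets that honour
            superposition (age must increase with depth)."""
            accepted = []
            while len(accepted) < n_draws:
                draw = [rng.normal(mu, sd) for _, mu, sd in horizons]
                if all(a < b for a, b in zip(draw, draw[1:])):   # deeper is older
                    accepted.append(draw)
            return np.array(accepted)

        draws = sample_monotonic_ages()
        for (depth, _, _), ages in zip(horizons, draws.T):
            lo, hi = np.percentile(ages, [2.5, 97.5])
            print(f"depth {depth:5.1f} m: {lo:.2f}-{hi:.2f} Ma (95% interval)")

    Rejection sampling is only the crudest stand-in for the constrained MCMC used by real age-depth software, but it shows how the ordering constraint alone tightens and skews the credible intervals.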

    An Experimental Investigation of Colonel Blotto Games

    "This article examines behavior in the two-player, constant-sum Colonel Blotto game with asymmetric resources, in which players maximize the expected number of battlefields won. The experimental results support all major theoretical predictions. In the auction treatment, where winning a battlefield is deterministic, disadvantaged players use a 'guerilla warfare' strategy which stochastically allocates zero resources to a subset of battlefields. Advantaged players employ a 'stochastic complete coverage' strategy, allocating random, but positive, resource levels across the battlefields. In the lottery treatment, where winning a battlefield is probabilistic, both players divide their resources equally across all battlefields." (author's abstract)

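    A small, hypothetical simulation of the two allocation styles described above (budgets, field count, and subset size are invented, not the experimental parameters): the disadvantaged player piles its budget onto a random subset of battlefields, the advantaged player spreads random but strictly positive amounts everywhere, and each battlefield goes to whichever side committed more.

        import random

        FIELDS = 8
        STRONG_BUDGET, WEAK_BUDGET = 120.0, 80.0   # hypothetical asymmetric endowments

        def guerilla(budget: float, fields: int, covered: int) -> list[float]:
            """'Guerilla warfare': everything on a random subset, zero elsewhere."""
            alloc = [0.0] * fields
            for f in random.sample(range(fields), covered):
                alloc[f] = budget / covered
            return alloc

        def stochastic_coverage(budget: float, fields: int) -> list[float]:
            """'Stochastic complete coverage': random but positive on every field."""
            weights = [random.random() + 0.05 for _ in range(fields)]
            total = sum(weights)
            return [budget * w / total for w in weights]

        rounds, weak_wins = 10_000, 0
        for _ in range(rounds):
            strong = stochastic_coverage(STRONG_BUDGET, FIELDS)
            weak = guerilla(WEAK_BUDGET, FIELDS, covered=4)
            weak_wins += sum(w > s for w, s in zip(weak, strong))
        print(f"disadvantaged player wins {weak_wins / rounds:.2f} of {FIELDS} battlefields on average")
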
    Exclusivity and exclusion on platform markets

    We examine conditions under which an exclusive license for a component that some consumers regard as essential, granted by the upstream producer to one of two potential suppliers in a downstream platform market, can make the unlicensed supplier unprofitable, even though both firms would be profitable if both were licensed. If downstream varieties are close substitutes, an exclusive license need not be exclusionary. If downstream varieties are highly differentiated, an exclusive license is exclusionary, but it is not in the interest of the upstream firm to grant one. For intermediate levels of product differentiation, an exclusive license is exclusionary and maximizes the upstream firm’s payoff.